- Knowledge Data Transformation
- Computers: KDT
Universal Russian-English Dictionary. Академик.ру. 2011.
Data conversion — is the conversion of computer data from one format to another. Throughout a computer environment, data is encoded in a variety of ways. For example, computer hardware is built on the basis of certain standards, which requires that data contains,… … Wikipedia
Data mining — Not to be confused with analytics, information extraction, or data analysis. Data mining (the analysis step of the knowledge discovery in databases process,[1] or KDD), a relatively young and interdisciplinary field of computer science[2][3] is… … Wikipedia
Knowledge Discovery Metamodel — (KDM) is a publicly available specification from the Object Management Group (OMG). KDM is a common intermediate representation for existing software systems and their operating environments that defines common metadata required for deep semantic… … Wikipedia
Knowledge-based engineering — (KBE) is a discipline with roots in computer aided design (CAD) and knowledge based systems, but it has several definitions and roles depending upon the context. An early role was as a support tool for a design engineer, generally within the context of… … Wikipedia
Data presentation architecture — (DPA) is a skill set that seeks to identify, locate, manipulate, format and present data in such a way as to optimally communicate meaning and proffer knowledge. … Wikipedia
Data exchange — is the process of taking data structured under a source schema and actually transforming it into data structured under a target schema, so that the target data is an accurate representation of the source data[citation needed]. Data exchange is… … Wikipedia
Data driven journalism — is a journalistic process based on analyzing and filtering large data sets for the purpose of creating a new story. Data driven journalism deals with open data that is freely available online and analyzed with open source tools.[1] Data driven… … Wikipedia
Data Encryption Standard — (infobox residue: figure caption "The Feistel function (F function) of DES"; Designers: IBM; First publis …) … Wikipedia
Data Web — refers to a government open source project that was started in 1995 to develop an open source framework that networks distributed statistical databases together into a seamless unified virtual data warehouse. Originally funded by the U.S. Census… … Wikipedia
Data virtualization — describes the process of abstracting disparate data sources (databases, applications, file repositories, websites, data services vendors, etc.) through a single data access layer (which may be any of several data access mechanisms). This… … Wikipedia
Data Pre-processing — is an often neglected but important step in the data mining process. The phrase "Garbage In, Garbage Out" is particularly applicable to data mining and machine learning projects. Data gathering methods are often loosely controlled, resulting in out … Wikipedia